# Low-Resource Optimization

**NLLB-200 Distilled 1.3B CT2 int8** · winstxnhdw · Machine Translation, Transformers, Multilingual · 7,666 downloads, 4 likes
NLLB-200 Distilled 1.3B is the distilled 1.3B-parameter version of Meta's No Language Left Behind (NLLB) project, supporting translation between 200 languages; this release is a CTranslate2 int8 conversion.
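
As a rough sketch of how a CTranslate2 int8 build like this is typically used (the local model directory, repository id, and language codes below are illustrative assumptions, not taken from this listing):

```python
import ctranslate2
import transformers

# Load the int8-quantized CTranslate2 model from a local directory
# (hypothetical path; download the converted weights there first).
translator = ctranslate2.Translator("nllb-200-distilled-1.3B-ct2-int8", device="cpu")

# The original NLLB tokenizer is still used for pre- and post-processing.
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang="eng_Latn"
)

# Tokenize the source sentence and request Persian (fas_Arab) output.
source = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello, world!"))
results = translator.translate_batch([source], target_prefix=[["fas_Arab"]])

# Drop the leading target-language token before decoding.
target_tokens = results[0].hypotheses[0][1:]
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens)))
```

Int8 weights take roughly a quarter of the memory of float32 weights, which is the main appeal of this kind of conversion for low-resource deployment.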
**IndoBERT IndoNLU QA** · Rifky · MIT · Question Answering, Transformers · 40 downloads, 1 like
An IndoBERT-based model fine-tuned on Indonesian question-answering data, suitable for Indonesian natural language understanding tasks.
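
A minimal usage sketch with the Transformers question-answering pipeline; the repository id and the Indonesian example text are assumptions for illustration:

```python
from transformers import pipeline

# Hypothetical repository id based on the listing above; check the exact id on the Hub.
qa = pipeline("question-answering", model="Rifky/Indobert-QA")

result = qa(
    question="Di mana Monumen Nasional berada?",
    context="Monumen Nasional (Monas) adalah monumen setinggi 132 meter "
            "yang terletak di pusat kota Jakarta, Indonesia.",
)
print(result["answer"], result["score"])
```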
**Wav2Vec2 Base Common Voice Persian Colab** · zoha · Apache-2.0 · Speech Recognition, Transformers · 21 downloads, 0 likes
A Persian speech recognition model fine-tuned from facebook/wav2vec2-base on Common Voice Persian data, used for Persian speech-to-text.
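
This model and the other wav2vec2 fine-tunes below can all be driven through the same automatic-speech-recognition pipeline; a minimal sketch, with a placeholder repository id and audio file name:

```python
from transformers import pipeline

# Placeholder repository id; substitute any of the wav2vec2 fine-tunes listed here.
asr = pipeline(
    "automatic-speech-recognition",
    model="zoha/wav2vec2-base-common-voice-persian-colab",
)

# wav2vec2-base models expect 16 kHz mono audio; passing a file path
# requires ffmpeg to be installed for decoding.
transcription = asr("recording.wav")
print(transcription["text"])
```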
**Wav2Vec2 Base Common Voice Fa Demo Colab** · zoha · Apache-2.0 · Speech Recognition, Transformers · 15 downloads, 0 likes
A Persian speech recognition model fine-tuned from facebook/wav2vec2-base, suitable for Persian speech-to-text tasks.
**Wav2Vec2 XLS-R 300M LM Hebrew** · imvladikon · Apache-2.0 · Speech Recognition, Transformers, Other · 21 downloads, 1 like
A Hebrew speech recognition model fine-tuned from facebook/wav2vec2-xls-r-300m, enhanced with an n-gram language model.
**Wav2Vec2 Base Russian Big Kaggle** · Eyvaz · Apache-2.0 · Speech Recognition, Transformers · 17 downloads, 1 like
A Russian speech recognition model fine-tuned from facebook/wav2vec2-base on Russian datasets.
**Wav2Vec2 Base Russian Modified Kaggle** · Eyvaz · Apache-2.0 · Speech Recognition, Transformers · 16 downloads, 1 like
A fine-tuned version of facebook/wav2vec2-base on an unspecified dataset, intended for Russian speech-to-text tasks.